Compressive Sensing DNA Microarrays
Compressive sensing microarrays (CSMs) are DNA-based sensors that operate using group testing and compressive sensing (CS) principles. In contrast to conventional DNA microarrays, in which each genetic sensor is designed to respond to a single target, in a CSM each sensor responds to a set of targets. We study the problem of designing CSMs that simultaneously account for both the constraints of CS theory and the biochemistry of probe-target DNA hybridization. An appropriate cross-hybridization model is proposed for CSMs, and several methods are developed for probe design and CS signal recovery based on the new model. Lab experiments suggest that, in order to achieve accurate hybridization profiling, consensus probe sequences are required to have sequence homology of at least 80% with all targets to be detected. Furthermore, out-of-equilibrium datasets are usually as accurate as those obtained under equilibrium conditions. Consequently, one can use CSMs in applications in which only short hybridization times are allowed.
Adaptive Measurement Network for CS Image Reconstruction
Conventional compressive sensing (CS) reconstruction is slow because it requires solving an optimization problem. Convolutional neural networks can realize fast processing while achieving comparable results. However, high-quality CS image recovery depends not only on a good reconstruction algorithm but also on good measurements. In this paper, we propose an adaptive measurement network in which the measurement is obtained by learning. The new network consists of a fully-connected layer and ReconNet; the fully-connected layer, which has a low-dimensional output, acts as the measurement. We train the fully-connected layer and ReconNet simultaneously to obtain an adaptive measurement. Because the adaptive measurement fits the dataset better than a random Gaussian measurement matrix, at the same measurement rate it can extract the information of the scene more efficiently and obtain better reconstruction results. Experiments show that the new network outperforms the original one.
Comment: 11 pages, 8 figures
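The core idea, a measurement matrix trained jointly with the reconstruction network, can be sketched in a stripped-down form. Here a linear decoder stands in for ReconNet, the data are toy low-dimensional signals rather than image patches, and the dimensions and learning rate are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for image patches: 200 signals of dimension 32 lying in a
# 4-dimensional subspace (illustrative assumption, not the paper's data).
basis = rng.standard_normal((32, 4))
X = rng.standard_normal((200, 4)) @ basis.T

m = 8                                    # measurement rate m/32
A = 0.1 * rng.standard_normal((m, 32))   # learned "fully-connected" measurement
D = 0.1 * rng.standard_normal((32, m))   # linear decoder standing in for ReconNet

loss0 = np.mean((X @ A.T @ D.T - X) ** 2)

lr = 1e-3
for _ in range(2000):                    # joint gradient descent on A and D
    Y = X @ A.T                          # adaptive measurements
    E = Y @ D.T - X                      # reconstruction error
    grad_D = E.T @ Y / len(X)            # gradient of MSE w.r.t. decoder
    grad_A = D.T @ E.T @ X / len(X)      # gradient of MSE w.r.t. measurement
    D -= lr * grad_D
    A -= lr * grad_A

loss = np.mean((X @ A.T @ D.T - X) ** 2)
print(loss0, loss)
```

The point of the sketch is the coupling: the measurement matrix A receives gradients through the decoder, so it adapts to the data distribution instead of staying a fixed random Gaussian matrix.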
Perceptual Compressive Sensing
Compressive sensing (CS) acquires measurements at a sub-Nyquist rate and recovers the scene images. Existing CS methods always recover the scene images at the pixel level. This causes over-smoothed recovered images and a lack of structure information, especially at a low measurement rate. To overcome this drawback, in this paper we propose perceptual CS to obtain high-level structured recovery. Our task no longer focuses on the pixel level; instead, we work to achieve a better visual effect. In detail, we employ a perceptual loss, defined at the feature level, to enhance the structure information of the recovered images. Experiments show that our method achieves better visual results with stronger structure information than existing CS methods at the same measurement rate.
Comment: Accepted by The First Chinese Conference on Pattern Recognition and Computer Vision (PRCV 2018). This is a pre-print version (not the final version).
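A perceptual loss compares images in a feature space rather than pixel space. The sketch below uses a tiny random-convolution feature extractor as a stand-in for the pretrained network a real perceptual loss would use (e.g. VGG features); the filters, image size, and smoothing are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def features(img, filters):
    """Tiny stand-in for a pretrained feature extractor:
    valid 3x3 convolutions followed by ReLU (illustrative, not VGG)."""
    H, W = img.shape
    out = []
    for f in filters:
        resp = np.zeros((H - 2, W - 2))
        for i in range(H - 2):
            for j in range(W - 2):
                resp[i, j] = np.sum(img[i:i + 3, j:j + 3] * f)
        out.append(np.maximum(resp, 0.0))   # ReLU
    return np.stack(out)

def perceptual_loss(x_hat, x, filters):
    """Mean squared error in feature space rather than pixel space."""
    fa, fb = features(x_hat, filters), features(x, filters)
    return float(np.mean((fa - fb) ** 2))

filters = rng.standard_normal((4, 3, 3))
x = rng.standard_normal((16, 16))           # ground-truth "scene"
x_blur = (x + np.roll(x, 1, axis=0) + np.roll(x, 1, axis=1)) / 3  # smoothed recovery

pixel_loss = float(np.mean((x_blur - x) ** 2))
feat_loss = perceptual_loss(x_blur, x, filters)
print(pixel_loss, feat_loss)
```

Training a reconstruction network against `feat_loss` instead of `pixel_loss` penalizes loss of edge and texture structure, which is exactly the over-smoothing the abstract describes.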
Estimating the number of components of a multicomponent nonstationary signal using the short-term time-frequency Rényi entropy
This article proposes a method for estimating the local number of signal components using the short-term Rényi entropy of signals in the time-frequency plane.
(Additional details can be found in the comprehensive book on Time-Frequency Signal Analysis and Processing; see http://www.elsevier.com/locate/isbn/0080443354. In addition, the most recent upgrade of the original software package that calculates Time-Frequency Distributions and Instantaneous Frequency estimators can be downloaded from the website www.time-frequency.net. This was the first software developed in the field; it was first released publicly in 1987 at the 1st ISSPA conference held in Brisbane, Australia, and has been continuously updated since.)
The time-frequency Rényi entropy provides a measure of the complexity of a nonstationary multicomponent signal in the time-frequency plane. When the complexity of a signal corresponds to the number of its components, this information is measured as the Rényi entropy of the time-frequency distribution (TFD) of the signal. This article presents a solution to the problem of detecting the number of components present in a short-time interval of the signal's TFD, using the short-term Rényi entropy. The method is automatic and does not require prior information about the signal. The algorithm is applied to both synthetic and real data, using a quadratic separable-kernel TFD. The results confirm that the short-term Rényi entropy can be an effective tool for estimating the local number of components present in the signal. The key aspect of selecting a suitable TFD is also discussed.
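The counting principle can be sketched concretely: if a reference single-component signal has Rényi entropy H_ref in the time-frequency plane, then K equal-power, well-separated components raise the entropy by about log2(K), so K ≈ 2^(H − H_ref). The sketch below uses a plain spectrogram slice instead of the paper's quadratic separable-kernel TFD, with assumed test frequencies and window sizes.

```python
import numpy as np

def renyi_entropy(tfd, alpha=3):
    """Rényi entropy (base 2) of a normalized time-frequency slice."""
    p = tfd / np.sum(tfd)
    return np.log2(np.sum(p ** alpha)) / (1 - alpha)

def spectrogram_column(x, t, win=64, nfft=256):
    """Magnitude-squared STFT slice around sample t (Hann window);
    a simple stand-in for the paper's separable-kernel TFD."""
    seg = x[t:t + win] * np.hanning(win)
    return np.abs(np.fft.rfft(seg, nfft)) ** 2

fs = 1000.0
n = np.arange(1024)
one = np.cos(2 * np.pi * 100 * n / fs)            # single-component reference
two = one + np.cos(2 * np.pi * 300 * n / fs)      # two well-separated components

h_ref = renyi_entropy(spectrogram_column(one, 400))
h_two = renyi_entropy(spectrogram_column(two, 400))

# Local component count: K ~ 2^(H_signal - H_reference)
k_est = 2 ** (h_two - h_ref)
print(k_est)
```

Because the estimate is computed on a short-time slice, repeating it along the time axis yields the local (time-varying) component count that the article targets.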
Incremental dimension reduction of tensors with random index
We present an incremental, scalable and efficient dimension reduction
technique for tensors that is based on sparse random linear coding. Data is
stored in a compactified representation with fixed size, which makes memory
requirements low and predictable. Component encoding and decoding are performed
on-line without computationally expensive re-analysis of the data set. The
range of tensor indices can be extended dynamically without modifying the
component representation. This idea originates from a mathematical model of
semantic memory and a method known as random indexing in natural language
processing. We generalize the random-indexing algorithm to tensors and present
signal-to-noise-ratio simulations for representations of vectors and matrices.
We present also a mathematical analysis of the approximate orthogonality of
high-dimensional ternary vectors, which is a property that underpins this and
other similar random-coding approaches to dimension reduction. To further
demonstrate the properties of random indexing we present results of a synonym
identification task. The method presented here has some similarities with
random projection and Tucker decomposition, but it performs well at high
dimensionality only (n>10^3). Random indexing is useful for a range of complex
practical problems, e.g., in natural language processing, data mining, pattern
recognition, event detection, graph searching and search engines. Prototype
software is provided. It supports encoding and decoding of tensors of order >=
1 in a unified framework, i.e., vectors, matrices and higher-order tensors.
Comment: 36 pages, 9 figures
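The order-1 case of random indexing can be sketched directly: each coordinate gets a sparse ternary index vector, values are accumulated into one fixed-size memory vector, and approximate orthogonality of the index vectors lets an inner product read values back out. Higher-order tensors use outer products of index vectors; the dimensions and sparsity below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
d, nnz = 2048, 16               # code dimension, nonzeros per index vector

def random_index():
    """Sparse ternary index vector: nnz/2 entries are +1, nnz/2 are -1."""
    v = np.zeros(d)
    pos = rng.choice(d, nnz, replace=False)
    v[pos[:nnz // 2]] = 1.0
    v[pos[nnz // 2:]] = -1.0
    return v

index = {}                      # index vectors, created lazily per coordinate
memory = np.zeros(d)            # fixed-size compact representation

def encode(i, value):
    """Add one (coordinate, value) pair to the fixed-size memory on-line."""
    if i not in index:          # new indices can appear at any time
        index[i] = random_index()
    memory[:] += value * index[i]

def decode(i):
    """Inner product with the near-orthogonal index vector recovers the value."""
    return float(memory @ index[i]) / nnz

for i, value in [(7, 3.0), (123, -1.5), (40000, 2.0)]:
    encode(i, value)

print([decode(i) for i in (7, 123, 40000)])
```

Note the properties the abstract emphasizes: the memory stays a fixed d-dimensional vector however many coordinates are encoded, encoding is incremental with no re-analysis of earlier data, and the index range extends dynamically (coordinate 40000 needs no pre-allocation). Decoding is approximate, with small cross-talk noise from the nearly (not exactly) orthogonal index vectors.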
International Consensus Statement on Rhinology and Allergy: Rhinosinusitis
Background: The 5 years since the publication of the first International Consensus Statement on Allergy and Rhinology: Rhinosinusitis (ICAR-RS) have witnessed foundational progress in our understanding and treatment of rhinologic disease. These advances are reflected within the more than 40 new topics covered within ICAR-RS-2021 as well as updates to the original 140 topics. This executive summary consolidates the evidence-based findings of the document. Methods: ICAR-RS presents over 180 topics in the form of evidence-based reviews with recommendations (EBRRs), evidence-based reviews, and literature reviews. The highest-grade structured recommendations of the EBRR sections are summarized in this executive summary. Results: ICAR-RS-2021 covers 22 topics regarding the medical management of RS that are grade A/B and are presented in the executive summary. Additionally, 4 topics regarding the surgical management of RS are grade A/B and are presented in the executive summary. Finally, a comprehensive evidence-based management algorithm is provided. Conclusion: This ICAR-RS-2021 executive summary provides a compilation of the evidence-based recommendations for the medical and surgical treatment of the most common forms of RS.
- …